March 8, 2018
Last spring, a video made the internet rounds showing a team of academic and cybersecurity researchers demonstrating how they had hacked into an industrial robot arm and discreetly modified its processes.
The result was that the 216-pound arm performed its task slightly differently than it had been programmed to. While the change in that case was small, such a deviation could cause major disruption on assembly lines where similar robots are used in industries ranging from food processing to aerospace.
The video highlighted an emerging vulnerability of the modern factory as more and more industrial systems join the internet of things to make it easier for factory managers to monitor and program the machines.
“Robots do more now than they ever have, and some companies are moving forward with, not just the assembly line robots, but freestanding robots that can actually drive around factory floors,” said Raheem Beyah, the Motorola Foundation Professor and interim Steve W. Chaddick School Chair in Georgia Tech’s School of Electrical and Computer Engineering. “So, in that type of setting, you can imagine how dangerous this could be if a hacker gains access to those machines. At a minimum, they could cause harm to whatever products are being produced. If it’s a large enough robot, it could destroy parts or the assembly line. In a worst-case scenario, it could injure or cause death to the humans in the vicinity.”
That realization has spurred computer security professionals to explore ways to prevent such attacks, and researchers at Georgia Tech have developed their own device to fight back.
It’s a robot small enough to fit in a shoe box. They have named it HoneyBot.
Internet security professionals have long employed decoy computer systems as a way to throw cyberattackers off the trail. Once hackers gain access to the decoy, they leave behind valuable information that can help companies further secure their networks. The decoys are known in the industry as “honeypots.”
“A lot of cyberattacks go unanswered or unpunished because there’s this level of anonymity afforded to malicious actors on the internet, and it’s hard for companies to say who is responsible,” said Celine Irvene, a Georgia Tech graduate student who worked with Beyah to devise the new robot. “Honeypots give security professionals the ability to study the attackers, determine what methods they are using, and figure out where they are or potentially even who they are.”
The research team applied the same concept to the HoneyBot. The gadget can be monitored and controlled through the internet. But unlike other remote-controlled robots, the HoneyBot’s special ability is tricking those operating it remotely into thinking it is performing one task, when, in reality, it is doing something completely different.
In a factory setting, such a robot could sit motionless in a corner, springing to life when a hacker gains access — a visual indicator to factory workers that a malicious actor is targeting the facility.
Rather than allow the hacker to then run amok in the physical world, the robot could be designed to follow certain commands deemed harmless, such as meandering slowly about or picking up objects, while stopping short of actually doing anything dangerous.
“The idea behind a honeypot is that you don’t want the attackers to know they’re in a honeypot,” Beyah said. “They’re always looking for indications that they’re in a virtual environment. So, we’d simulate any command that we’d know will be disruptive.”
For example, if the attackers instruct the robot to pick up something off a conveyor belt and throw it on the floor, instead the robot would simply turn and place the object back on the belt. To the hacker, however, the robot would send back data indicating it had thrown the object as instructed.
“If the attacker is smart and is looking out for the potential of a honeypot, maybe they’d look at different sensors on the robot, like an accelerometer or speedometer, to verify the robot is doing what it had been instructed,” Beyah said. “That’s where we would be spoofing that information as well. The hacker would see from looking at the sensors that the appropriate acceleration occurred from point A to point B. Further, several sensors may present the same type of information so we ensure that they are correlated.”
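The HoneyBot’s internals are not published here, so the sketch below is only a rough illustration of the behavior Beyah describes: an allow-list of benign commands is executed for real, while anything else is quietly swallowed and answered with telemetry fabricated from a single kinematic trace, so that correlated readings such as speed and acceleration agree. All command names, thresholds, and telemetry fields are hypothetical.

```python
import random
import time

# Hypothetical allow-list of commands considered harmless to execute for real.
SAFE_COMMANDS = {"move_slow", "pick_up", "place_on_belt"}

def fake_telemetry(command, distance_m, duration_s, steps=10):
    """Fabricate speed/acceleration readings consistent with the *claimed* motion.

    Both channels come from one kinematic trace, so correlated sensors tell
    the attacker the same story.
    """
    dt = duration_s / steps
    avg_speed = distance_m / duration_s
    readings, prev_speed = [], 0.0
    for i in range(steps):
        # Ramp up toward the average speed with a little plausible noise.
        speed = avg_speed * min(1.0, (i + 1) / 3) * (1 + random.uniform(-0.05, 0.05))
        accel = (speed - prev_speed) / dt  # derived from the same speed channel
        readings.append({"t": round((i + 1) * dt, 2),
                         "speed": round(speed, 3),
                         "accel": round(accel, 3)})
        prev_speed = speed
    return readings

def handle_command(command, distance_m=1.0, duration_s=2.0):
    """Execute benign commands for real; spoof the rest while reporting compliance."""
    if command in SAFE_COMMANDS:
        # ... drive the real actuators and stream genuine sensor data here ...
        return {"status": "ok", "telemetry": "live sensor stream"}
    # Disallowed command: stay put (or do a harmless substitute), but take as long
    # as the real action would and reply with telemetry claiming it happened.
    time.sleep(duration_s)
    return {"status": "ok", "telemetry": fake_telemetry(command, distance_m, duration_s)}

print(handle_command("throw_object_on_floor"))
```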
The researchers call the HoneyBot, which is partially funded with a grant from the National Science Foundation, a hybrid honeypot system — partially a real device responding in real life to commands, partially simulating those commands.
“The problem with a 100 percent simulated system is that it makes it easier for a sophisticated hacker to determine that it’s not real,” Irvene said. “Let’s say the hacker tells the robot to perform a process that should take a few seconds or a minute to complete, such as opening a relay. In a simulated environment, the system would respond immediately that the relay is now open. And that might tip off the hacker.”
Having a real robot also gives the researchers a physical system to model for their simulations rather than just plucking numbers from the air.
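A minimal sketch of that idea, with the relay timings and function name invented purely for illustration: the simulated acknowledgment is delayed by a duration drawn from measurements taken on the physical device, rather than returned instantly as a pure simulation would.

```python
import random
import time

# Hypothetical relay-opening times (seconds) recorded on the physical device;
# a hybrid honeypot can measure these, a pure simulation would have to guess.
MEASURED_RELAY_TIMES = [4.8, 5.1, 5.3, 4.9, 5.0]

def simulated_open_relay():
    """Acknowledge a relay-open command only after a realistic, measured delay."""
    delay = random.choice(MEASURED_RELAY_TIMES) * random.uniform(0.97, 1.03)
    time.sleep(delay)  # an instant reply would tip off a careful attacker
    return {"relay": "open", "elapsed_s": round(delay, 2)}

print(simulated_open_relay())
```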
To put the HoneyBot through its paces and test the concept, Beyah and Irvene devised an experiment to give volunteers the ability to control the robot across the internet and to navigate through a maze in their lab. The volunteers would be paid a small amount to complete the maze and would earn a little more money for finishing it quickly.
Like playing a video game, the volunteers would use a virtual interface to control the robot and would not be able to see what was happening in real life. To entice the volunteers to break the rules, at specific spots within the maze, the volunteers would see a forbidden “shortcut” that would allow them to finish the maze faster.
In the real maze back in the lab, no shortcut would exist, and if the volunteers opted to take it, the robot would instead remain still. Meanwhile, the volunteers, who would have unwittingly become hackers for the purposes of the experiment, would see sensor data indicating they had passed through the shortcut and continued along the maze.
“We want to make sure they feel that this robot was doing this real thing, and at the end we’re going to ask them if they knew that they were deceived at that ‘shortcut,’” Beyah said. “That will allow us to see if we’re on the right track.”
The growth of internet-connected things has enabled businesses and other organizations to collect data in ways that were out of reach just a few years ago.
But all of that data streaming in real time, often from disparate sources, can have an unintended consequence: How does an organization keep track of it all?
The Georgia Tech Police Department (GTPD) faced that problem a few years ago. With hundreds of cameras and other sensors placed throughout campus, the department needed a way to make it easier for officers to quickly access information, even when out on the beat.
Enter COP, or Common Operating Picture, a new data visualization interface developed in partnership with the Georgia Tech Research Institute (GTRI).
The new digital interface starts with a map of campus. Icons hover over the map showing the locations of cameras and squad cars. If an officer clicks on a building icon, a picture of the building appears with options to view camera feeds within the building and layouts of each floor.
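GTRI has not published the COP’s internals, but the interface described above suggests a simple underlying data model: map icons tied to buildings, each resolving to floor layouts and camera feeds. A rough sketch, with all class and field names hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Camera:
    camera_id: str
    stream_url: str             # live feed endpoint
    supports_ptz: bool = False  # pan/tilt/zoom control available

@dataclass
class Building:
    name: str
    lat: float                  # where the icon sits on the campus map
    lon: float
    floor_plans: list = field(default_factory=list)  # image URLs, one per floor
    cameras: list = field(default_factory=list)

def on_icon_click(building):
    """What the interface might return when an officer clicks a building icon."""
    return {
        "building": building.name,
        "floor_plans": building.floor_plans,
        "feeds": [c.stream_url for c in building.cameras],
    }

example = Building("Example Hall", 33.776, -84.398,
                   floor_plans=["example_hall_floor1.png"],
                   cameras=[Camera("cam-101", "rtsp://example/cam-101", supports_ptz=True)])
print(on_icon_click(example))
```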
The goal of the new interface is to better use the video feeds and sensor data during ongoing situations, said GTPD Chief Rob Connolly.
“If we have something like an active robbery, we want to catch them before they get off campus,” he said. “The way we used to use cameras was more forensic. This new tool will make it easier to access the camera feeds we need, when we need them, to improve situational awareness and help us with our plan of action.”
As the number of cameras placed on campus grew through the years, dispatchers were tasked with juggling multiple computer programs to keep track of all the feeds. In a bid to simplify that, the department invested a few years ago in a third-party computer program that could incorporate multiple types of information in a single screen.
The new GTRI-made visualization tool builds on that, giving the department a custom-tailored program that can be expanded over time, said Leigh McCook, a GTRI principal research associate.
“We wanted a situational awareness product that we could own, and could use to build in capabilities we needed without relying on a third party,” said Jeff Hunnicutt, physical security specialist with the police department.
The team of researchers at GTRI, which in addition to McCook included Evan Stuart, Kristin Morgan, Trevor Goodyear, and Winston Messer, built an underlying system for the interface that also allows physical control of cameras, such as pan, tilt, and zoom functions.
One big improvement over the previous system is the ability to use the COP on mobile devices, giving officers in the field much more information at their fingertips. McCook said the COP platform could be adapted for other police forces and public safety agencies.
As companies across the planet respond to market demand and move forward with developing their own internet of things technologies and devices, Georgia Tech is also looking for ways to tie those companies together with researchers to help shape the industry.
“No company can have the complete knowledge base,” said Alain Louchez, managing director of Georgia Tech’s Center for the Development and Application of Internet of Things Technologies (CDAIT). “So, in some cases competitors become partners to help us tackle some of these really difficult issues.”
CDAIT (pronounced “sedate”) was launched in a bid to help bridge companies with Georgia Tech researchers at the forefront of answering those questions. Since 2014, more than 20 market-leading companies have joined the initiative and, through their active involvement on the board and in working groups, are helping to shape the center’s research activities. In addition, CDAIT has developed collaborative agreements around the world with nonprofit organizations dedicated to the advancement of the internet of things.
With such internet of things growth, it makes sense for companies to work together to address technical questions such as interoperability, device discovery, and security, as well as management and societal issues including how the internet of things affects business models, privacy, trust, ethics, regulation, and policy, Louchez explained.
Beyond helping companies refine their internet of things strategies, CDAIT also aims to answer fundamental questions about the internet of things, such as, “What defines an internet of things technology?”
“That is important, because if we don’t know where you want to go, we cannot tell you how to get there,” Louchez said. “It’s crucial to recognize the complexity of IoT, and get a clear idea of the whole IoT value chain with its numerous moving parts. Once you grasp what the internet of things entails, you can figure out how to plan products and services, and the implications of what you’re creating.”
CDAIT has established six working groups composed of researchers and industry leaders, who all have begun with a key challenge or goal. They are tackling topics that encompass internet of things security and privacy, technological standards, workforce development, the entrepreneurial ecosystem, overall internet of things issues and future hurdles, and specific internet of things research projects.
Louchez’s vision also includes the next internet of things phase, which will deal with the automation of many processes that today require a human operator to initiate.
“Traditionally, all devices needed to be interacting with humans at some point,” Louchez said. “Now, the big change is that with the internet of things, devices with various degrees of intelligence will be communicating with each other.”
That will be true with consumer appliances and industrial systems, generating even more demand for sorting out the internet of things landscape and answering lingering questions about policies.
“I think we’ve shown that there is a need for a group that can bring together so many stakeholders to identify, understand, and help solve some of the pivotal challenges tied to the fast-emerging IoT space,” Louchez said. “The fact that we have a host of premier global companies joining us suggests we may be on to something.”